14 research outputs found

    An extensible architecture for run-time monitoring of conversational web services

    No full text
    Trust in Web services will be greatly enhanced if they are subject to run-time verification, even if they were previously tested, since their context of execution is subject to continuous change and services may be upgraded without their consumers being notified in advance. Conversational Web services introduce added complexity when it comes to run-time verification, since they follow a conversation protocol and have a state bound to the session of each consumer accessing them. Furthermore, conversational Web services have different policies on how they maintain their state: access to states can be private or shared, and states may be transient or persistent. These differences must be taken into account when building a scalable architecture for run-time verification through monitoring. This paper, building on a previously proposed theoretical framework for run-time verification of conversational Web services, presents the design, implementation and validation of a novel run-time monitoring architecture for conversational services, which aims to provide a holistic monitoring framework enabling the integration of different verification tools. The architecture is validated by running a sequence of test scenarios based on a realistic example. The experimental results revealed that the monitoring activities impose a tolerable overhead on the operation of a Web service.
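    The core idea of per-session protocol checking can be sketched in a few lines. The following is a minimal illustration, not the paper's actual architecture or API: all class and method names are hypothetical, and a real monitor would forward violations to pluggable verification tools rather than simply return a flag.

    ```python
    class ProtocolMonitor:
        """Hypothetical sketch: run-time checking of a conversation protocol,
        keeping one protocol state per consumer session."""

        def __init__(self, transitions, initial="START"):
            self.transitions = transitions   # {(state, operation): next_state}
            self.initial = initial
            self.sessions = {}               # session id -> current protocol state

        def check(self, session_id, operation):
            """Advance the session's state if the call respects the protocol;
            report a violation otherwise."""
            state = self.sessions.get(session_id, self.initial)
            next_state = self.transitions.get((state, operation))
            if next_state is None:
                return False  # protocol violation for this session
            self.sessions[session_id] = next_state
            return True
    ```

    Keeping the session table inside the monitor mirrors the distinction the paper draws between private/shared and transient/persistent service state: a monitor for shared or persistent state would need a different keying scheme for `sessions`.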

    Enabling Proactive Adaptation through Just-in-time Testing of Conversational Services

    No full text
    Service-based applications (SBAs) will increasingly be composed of third-party services available over the Internet. Reacting to failures of those third-party services by dynamically adapting the SBAs will become a key enabler for ensuring reliability. Determining when to adapt an SBA is especially challenging in the presence of conversational (i.e., stateful) services. A conversational service might fail in the middle of an invocation sequence, in which case adapting the SBA might be costly, e.g., due to the necessary state transfer to an alternative service. In this paper, we propose just-in-time testing of conversational services as a novel approach to detect potential problems and to proactively trigger adaptations, thereby preventing costly compensation activities. The approach is based on a framework for online testing and a formal test-generation method which guarantees functional correctness for conversational services. The applicability of the approach is discussed with respect to its underlying assumptions and its performance. The benefits of the approach are demonstrated using a realistic example.
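    The just-in-time testing idea can be illustrated with a small sketch: before committing to an adaptation, replay a generated input sequence against a fresh instance of a candidate service and compare observed outputs with the expected ones. The service class, function names, and test-sequence format below are illustrative assumptions, not the paper's framework.

    ```python
    class CartService:
        """Stand-in for a third-party conversational (stateful) service."""
        def __init__(self):
            self.items = []

        def add(self, item):
            self.items.append(item)
            return len(self.items)

        def checkout(self):
            return "ordered" if self.items else "empty"


    def just_in_time_test(make_service, test_sequence):
        """Sketch: run a test sequence of (operation, args, expected_output)
        against a fresh service instance; any mismatch or exception flags the
        candidate as unsuitable, so a costly adaptation can be avoided."""
        service = make_service()
        for operation, args, expected in test_sequence:
            try:
                if getattr(service, operation)(*args) != expected:
                    return False
            except Exception:
                return False
        return True
    ```

    In the paper's setting the test sequence would come from the formal test-generation method rather than being written by hand.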

    Leveraging Semantic Web Service Descriptions for Validation by Automated Functional Testing

    Get PDF
    Recent years have seen the utilisation of Semantic Web Service descriptions for automating a wide range of service-related activities, with a primary focus on service discovery, composition, execution and mediation. An important area which so far has received less attention is service validation, whereby advertised services are proven to conform to required behavioural specifications. This paper proposes a method for validation of service-oriented systems through automated functional testing. The method leverages ontology-based and rule-based descriptions of service inputs, outputs, preconditions and effects (IOPE) for constructing a stateful EFSM specification. The specification is subsequently utilised for functional testing and validation using the proven Stream X-machine (SXM) testing methodology. Complete functional test sets are generated automatically at an abstract level and are then applied to concrete Web services, using test drivers created from the Web service descriptions. The approach comes with completeness guarantees and provides a strong method for validating the behaviour of Web services.
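    A basic building block of state-machine-based test generation is the transition cover: one abstract input sequence reaching each transition of the specification. The sketch below computes such a cover for a simple finite-state model; names are illustrative, and a full SXM test set would additionally append characterisation sequences to distinguish states.

    ```python
    from collections import deque

    def transition_cover(transitions, initial):
        """Sketch: for a spec given as {(state, input): next_state}, return one
        abstract input sequence exercising each transition, built from shortest
        paths (BFS) to every reachable state."""
        paths = {initial: []}          # state -> shortest input sequence to it
        queue = deque([initial])
        while queue:
            state = queue.popleft()
            for (src, inp), dst in transitions.items():
                if src == state and dst not in paths:
                    paths[dst] = paths[state] + [inp]
                    queue.append(dst)
        # one test per transition: the path to its source state plus its input
        return [paths[src] + [inp] for (src, inp) in transitions if src in paths]
    ```

    Applied to a concrete Web service, each abstract sequence would be translated into real invocations by a test driver generated from the service description.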

    Constraint-based runtime prediction of SLA violations in service orchestrations

    Get PDF
    Service compositions put together loosely-coupled component services to perform more complex, higher level, or cross-organizational tasks in a platform-independent manner. Quality-of-Service (QoS) properties, such as execution time, availability, or cost, are critical for their usability, and permissible boundaries for their values are defined in Service Level Agreements (SLAs). We propose a method whereby constraints that model SLA conformance and violation are derived at any given point of the execution of a service composition. These constraints are generated using the structure of the composition and properties of the component services, which can be either known or empirically measured. Violation of these constraints means that the corresponding scenario is unfeasible, while satisfaction gives values for the constrained variables (start/end times for activities, or number of loop iterations) which make the scenario possible. These results can be used to perform optimized service matching or to trigger preventive adaptation or healing.
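    For a purely sequential composition the idea reduces to simple interval arithmetic, which the following sketch illustrates. It is a deliberately simplified stand-in for the paper's constraint-based method (which handles richer composition structure): the function name and the two-flag result are assumptions made for illustration.

    ```python
    def sla_feasible(remaining_activities, elapsed, deadline):
        """Sketch: at some point mid-execution, bound the total completion time
        of a sequential composition from the remaining activities' (min, max)
        duration estimates, and compare against the SLA deadline.

        Returns (feasible, at_risk):
          feasible -- even the best case can still meet the deadline
          at_risk  -- the worst case would violate the deadline
        """
        best_remaining = sum(lo for lo, hi in remaining_activities)
        worst_remaining = sum(hi for lo, hi in remaining_activities)
        feasible = elapsed + best_remaining <= deadline
        at_risk = elapsed + worst_remaining > deadline
        return feasible, at_risk
    ```

    When `feasible` is false, conformance has become unachievable and preventive adaptation (e.g., substituting a faster service) can be triggered before the violation actually occurs.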

    X-Machine Based Testing for Cloud Services

    Get PDF
    In this article we present a tool designed for cloud service testing, able to generate test cases from a formal specification of the service in the form of a deterministic stream X-machine (DSXM) model. The paper summarizes the theoretical foundations of X-machine based testing and illustrates the usage of the developed tool on some examples. It shows in detail how the specification should be written and which design-for-test conditions it should satisfy in order to ensure the generation of high-quality test suites for the cloud service.
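    To make the notion of a stream X-machine specification concrete, here is a toy sketch: transitions are labelled with processing functions over (memory, input) pairs, each producing an output and an updated memory. Everything here (the account example, `SPEC`, `run`) is an illustrative assumption, not the tool's actual specification language; determinism, one of the design-for-test conditions, is achieved by making at most one function applicable to any input.

    ```python
    # Processing functions: (memory, input) -> (output, new_memory), or None
    # when the function is not applicable to this input.
    def deposit(mem, inp):
        kind, amount = inp
        if kind == "deposit":
            return ("ok", mem + amount)
        return None

    def withdraw(mem, inp):
        kind, amount = inp
        if kind == "withdraw" and amount <= mem:
            return ("ok", mem - amount)
        return None

    SPEC = {
        "initial": ("open", 0),                        # (state, memory)
        "transitions": {("open", deposit): "open",
                        ("open", withdraw): "open"},
    }

    def run(spec, inputs):
        """Execute the DSXM-style spec on an input stream, collecting outputs."""
        state, mem = spec["initial"]
        outputs = []
        for inp in inputs:
            for (src, fn), dst in spec["transitions"].items():
                result = fn(mem, inp) if src == state else None
                if result is not None:
                    out, mem = result
                    outputs.append(out)
                    state = dst
                    break
            else:
                outputs.append("refused")  # no applicable transition
        return outputs
    ```

    An overdraft attempt leaves both functions inapplicable, so the input is refused, which is exactly the kind of behaviour a generated test suite would probe.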

    SLAs for cross-layer adaptation and monitoring of service-based applications

    No full text
    Cross-layer adaptation and monitoring (CLAM) is an approach to the run-time quality assurance of service-based applications (SBAs). The aim of CLAM is to monitor the different layers of an SBA and correlate the monitoring results, so that, when a problem occurs, an effective adaptation strategy can be inferred for enacting a coordinated adaptation across all layers of the SBA. An important aspect of CLAM is the definition of appropriate Service-Level Agreements (SLAs) for third-party services utilised in the different layers of the SBAs. In this paper, we present insights into how to define SLAs for CLAM, by analysing SBAs in order to differentiate the third-party business, software and infrastructure services utilised by the SBA. As a case study, we apply the analytical approach to an existing platform-as-a-service framework, which has been developed as an SBA and could benefit from CLAM. The analysis reveals the different third-party services and their characteristics, as a precursor to defining SLAs. The case study successfully demonstrates how distinct SLAs for business, software and infrastructure services may be applied respectively in the BPM, SCC and SI layers of an SBA, to provide a flexible monitoring and adaptation response across layers.

    JSXM: A Tool for Automated Test Generation

    No full text